Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
Authors
Abstract
Similar resources
Superlinear Projected Structured Exact Penalty Secant Methods for Constrained Nonlinear Least Squares
We present an exact penalty approach for solving constrained nonlinear least squares problems, using a new projected structured Hessian approximation scheme. We establish general conditions for the local two-step Q-superlinear convergence of the algorithm. The approach is general enough to include the projected versions of the structured PSB, DFP and BFGS formulas as special cases. The num...
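For orientation, the structured secant condition that such structured Hessian schemes build on is usually stated as follows; this is a generic sketch of the standard Hessian splitting for nonlinear least squares, not necessarily the exact projected variant used in the paper above. Writing $f(x) = \tfrac{1}{2}\|r(x)\|^2$ with Jacobian $J(x)$ of $r$, the Hessian splits as

\[
\nabla^2 f(x) = J(x)^T J(x) + S(x), \qquad S(x) = \sum_i r_i(x)\,\nabla^2 r_i(x),
\]

and with $s_k = x_{k+1} - x_k$ the structured secant condition asks the approximation $A_{k+1} \approx S(x_{k+1})$ to satisfy

\[
A_{k+1} s_k = y_k^{\sharp} := (J_{k+1} - J_k)^T r_{k+1},
\]

so that the full Hessian approximation $B_{k+1} = J_{k+1}^T J_{k+1} + A_{k+1}$ obeys $B_{k+1} s_k = J_{k+1}^T J_{k+1} s_k + y_k^{\sharp}$.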
Condition Numbers for Structured Least Squares Problems
This paper studies the normwise perturbation theory for structured least squares problems. The structures under investigation are symmetric, persymmetric, skew-symmetric, Toeplitz and Hankel. We present the condition numbers for structured least squares. AMS subject classification (2000): 15A18, 65F20, 65F25, 65F50.
A secant method for nonlinear least-squares minimization
Quasi-Newton methods have played a prominent role, over many years, in the design of effective practical methods for the numerical solution of nonlinear minimization problems and in multi-dimensional zero-finding. There is a wide literature outlining the properties of these methods and illustrating their performance [e.g., [8]]. In addition, most modern optimization libraries house a quasi-Newt...
Approximate Gauss-Newton Methods for Nonlinear Least Squares Problems
The Gauss–Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an...
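As a rough illustration of the basic procedure described above (plain Gauss–Newton, not the approximate variants analyzed in that paper), the sketch below solves a linear least squares subproblem at each iteration; the residual function, Jacobian, and synthetic fitting data are hypothetical.

import numpy as np

def gauss_newton(residual, jacobian, x0, max_iter=50, tol=1e-10):
    """Minimal Gauss-Newton sketch: at each step, solve the linear
    least squares problem  min_p || J(x) p + r(x) ||  and set x <- x + p."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)          # residual vector r(x)
        J = jacobian(x)          # Jacobian of r at x
        # Gauss-Newton step from the linearization r(x + p) ~ r(x) + J p
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + p
        if np.linalg.norm(p) < tol * (1.0 + np.linalg.norm(x)):
            break
    return x

# Hypothetical example: fit y = a * exp(b * t) to synthetic data.
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * t)

def residual(x):
    a, b = x
    return a * np.exp(b * t) - y

def jacobian(x):
    a, b = x
    return np.column_stack((np.exp(b * t), a * t * np.exp(b * t)))

print(gauss_newton(residual, jacobian, x0=[1.0, 1.0]))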
Tensor Methods for Large, Sparse Nonlinear Least Squares Problems
This paper introduces tensor methods for solving large, sparse nonlinear least squares problems where the Jacobian either is analytically available or is computed by finite difference approximations. Tensor methods have been shown to have very good computational performance for small to medium-sized, dense nonlinear least squares problems. In this paper we consider the application of tensor metho...
Journal
Journal title: Journal of Computational and Applied Mathematics
Year: 2010
ISSN: 0377-0427
DOI: 10.1016/j.cam.2009.12.031